Inequalities are very important in the study of information theory. There are a number of different contexts in which these inequalities appear.

==Shannon-type inequalities==
Consider a finite collection of finitely (or at most countably) supported random variables <math>X_1, X_2, \dots, X_n</math> on the same probability space. For a collection of ''n'' random variables, there are <math>2^n - 1</math> non-empty subsets for which joint entropies can be defined. For example, when ''n'' = 2, we may consider the entropies <math>H(X_1),</math> <math>H(X_2),</math> and <math>H(X_1, X_2)</math> and express the following inequalities (which together characterize the range of the marginal and joint entropies of two random variables):

* <math>H(X_1) \ge 0</math>
* <math>H(X_2) \ge 0</math>
* <math>H(X_1) \le H(X_1, X_2)</math>
* <math>H(X_2) \le H(X_1, X_2)</math>
* <math>H(X_1, X_2) \le H(X_1) + H(X_2)</math>

In fact, these can all be expressed as special cases of a single inequality involving the conditional mutual information, namely

: <math>I(A; B \mid C) \ge 0,</math>

where <math>A</math>, <math>B</math>, and <math>C</math> each denote the joint distribution of some arbitrary (possibly empty) subset of our collection of random variables. Inequalities that can be derived from this are known as ''Shannon-type'' inequalities.

More formally (following the notation of Yeung), define <math>\Gamma^*_n</math> to be the set of all ''constructible'' points in <math>\mathbb{R}^{2^n-1},</math> where a point is said to be constructible if and only if there is a joint, discrete distribution of ''n'' random variables such that each coordinate of that point, indexed by a non-empty subset of <math>\{1, 2, \dots, n\},</math> is equal to the joint entropy of the corresponding subset of the ''n'' random variables. The closure of <math>\Gamma^*_n</math> is denoted <math>\overline{\Gamma^*_n}.</math> The cone in <math>\mathbb{R}^{2^n-1}</math> characterized by all Shannon-type inequalities among ''n'' random variables is denoted <math>\Gamma_n.</math> In general

: <math>\Gamma^*_n \subseteq \overline{\Gamma^*_n} \subseteq \Gamma_n.</math>

Software has been developed to automate the task of proving such inequalities. Given an inequality, such software is able to determine whether the closed half-space defined by the given inequality contains the cone <math>\Gamma_n,</math> in which case the inequality can be verified, since <math>\overline{\Gamma^*_n} \subseteq \Gamma_n.</math>
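To make the inequalities above concrete, here is a minimal Python sketch (assuming only NumPy; the helper <code>entropy</code> is illustrative, not taken from any particular library) that draws a random joint distribution of two variables, computes the three entropies, and checks the five inequalities numerically:

<syntaxhighlight lang="python">
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector, ignoring zero entries."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)

# A random joint distribution p(x1, x2) on a 3-by-4 alphabet.
p = rng.random((3, 4))
p /= p.sum()

h1 = entropy(p.sum(axis=1))   # H(X1), marginalizing out x2
h2 = entropy(p.sum(axis=0))   # H(X2), marginalizing out x1
h12 = entropy(p.ravel())      # H(X1, X2), the joint entropy

eps = 1e-12                   # tolerance for floating-point comparisons
assert h1 >= 0 and h2 >= 0                    # non-negativity
assert h1 <= h12 + eps and h2 <= h12 + eps    # monotonicity
assert h12 <= h1 + h2 + eps                   # subadditivity

# All five are instances of I(A; B | C) >= 0; for example, with C empty,
# I(X1; X2) = H(X1) + H(X2) - H(X1, X2) >= 0.
assert h1 + h2 - h12 >= -eps
print(h1, h2, h12)
</syntaxhighlight>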
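Similarly, the definition of <math>\Gamma^*_n</math> can be illustrated by mapping a joint distribution to its constructible point, one joint-entropy coordinate per non-empty subset of the ''n'' variables. A sketch under the same assumptions (NumPy; the name <code>entropic_vector</code> is hypothetical):

<syntaxhighlight lang="python">
import itertools
import numpy as np

def entropy(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def entropic_vector(p):
    """Map a joint pmf, given as an n-dimensional array, to its point
    in R^(2^n - 1): one joint-entropy coordinate per non-empty subset."""
    n = p.ndim
    point = {}
    for r in range(1, n + 1):
        for subset in itertools.combinations(range(n), r):
            rest = tuple(ax for ax in range(n) if ax not in subset)
            point[subset] = entropy(p.sum(axis=rest).ravel())
    return point

rng = np.random.default_rng(1)
p = rng.random((2, 2, 2))
p /= p.sum()                  # a joint distribution of n = 3 binary variables
for subset, h in sorted(entropic_vector(p).items()):  # 2^3 - 1 = 7 coordinates
    print(subset, round(h, 4))
</syntaxhighlight>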
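The verification step just described can be phrased as a linear program: a candidate inequality <math>b^\top h \ge 0</math> is Shannon-type exactly when the half-space it defines contains <math>\Gamma_n,</math> i.e. when minimizing <math>b^\top h</math> over the cone yields 0 rather than <math>-\infty.</math> The following sketch does this for ''n'' = 2 with SciPy's <code>linprog</code>; the constraint matrix lists the three elemental inequalities generating <math>\Gamma_2,</math> and the helper name <code>is_shannon_type</code> is illustrative, not the interface of any actual prover such as ITIP:

<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import linprog

# Coordinates of a point h, ordered as h = (H(X1), H(X2), H(X1,X2)).
# Elemental Shannon inequalities G @ h >= 0 that generate the cone Gamma_2:
G = np.array([
    [ 0.0, -1.0,  1.0],   # H(X1 | X2) = H(X1,X2) - H(X2) >= 0
    [-1.0,  0.0,  1.0],   # H(X2 | X1) = H(X1,X2) - H(X1) >= 0
    [ 1.0,  1.0, -1.0],   # I(X1; X2)  = H(X1) + H(X2) - H(X1,X2) >= 0
])

def is_shannon_type(b):
    """Check whether b @ h >= 0 holds on the whole cone {h : G @ h >= 0}.

    Since the cone contains the origin, minimizing b @ h over it gives
    either 0 (the inequality holds) or an unbounded objective (it fails).
    """
    res = linprog(c=b, A_ub=-G, b_ub=np.zeros(len(G)),
                  bounds=[(None, None)] * G.shape[1])
    return res.status == 0    # 0: optimum (value 0) found; 3: unbounded

# H(X1,X2) <= H(X1) + H(X2), rewritten as (1, 1, -1) @ h >= 0:
print(is_shannon_type(np.array([1.0, 1.0, -1.0])))   # True
# "H(X1) >= H(X2)" is not a valid entropy inequality:
print(is_shannon_type(np.array([1.0, -1.0, 0.0])))   # False
</syntaxhighlight>

Actual provers work along these lines, using the full set of elemental inequalities for general ''n''.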